Perturbed Iterate SGD for Lipschitz Continuous Loss Functions

Authors

Abstract

This paper presents an extension of stochastic gradient descent for the minimization of Lipschitz continuous loss functions. Our motivation is its use in non-smooth, non-convex optimization problems, which are frequently encountered in applications such as machine learning. Using the Clarke $\epsilon$-subdifferential, we prove non-asymptotic convergence in expectation to an approximate stationary point for the proposed method. From this result, a method with convergence guarantees holding with high probability, as well as asymptotically almost surely, is developed. The results hold under the assumption that the loss function is Carathéodory and Lipschitz continuous everywhere in the decision variables. To the best of our knowledge, this is the first analysis under these minimal assumptions.
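The core idea described in the abstract — evaluating stochastic subgradients at randomly perturbed iterates so that the accumulated directions relate to the Clarke $\epsilon$-subdifferential — can be illustrated with a minimal sketch. This is not the authors' exact algorithm; the step size, perturbation radius, and `grad_sample` oracle here are illustrative assumptions.

```python
import numpy as np

def perturbed_iterate_sgd(grad_sample, x0, step=1e-2, radius=1e-3,
                          iters=1000, rng=None):
    """Illustrative SGD variant with perturbed iterates.

    grad_sample(x) should return a stochastic (sub)gradient estimate at x.
    Each update evaluates the oracle at a point drawn uniformly from a
    small ball around the current iterate, a device that connects the
    update directions to the Clarke epsilon-subdifferential.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Sample u uniformly from the ball of the given radius:
        # normalize a Gaussian direction, then rescale by radius * U^(1/d).
        u = rng.normal(size=x.shape)
        u *= radius * rng.random() ** (1.0 / x.size) / np.linalg.norm(u)
        g = grad_sample(x + u)   # subgradient taken at the perturbed point
        x = x - step * g         # standard SGD step
    return x

# Example on a non-smooth, Lipschitz loss f(x) = |x_1| + |x_2|,
# whose subgradient away from zero is sign(x).
x_final = perturbed_iterate_sgd(lambda z: np.sign(z), np.array([1.0, -2.0]))
```

With a fixed step size the iterates cannot converge exactly but settle within a step-sized neighborhood of the minimizer, which is consistent with the approximate-stationarity flavor of the guarantees in the abstract.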


Similar Articles

Lipschitz Functions of Perturbed Operators

We prove that if f is a Lipschitz function on R, and A and B are self-adjoint operators such that rank(A − B) = 1, then f(A) − f(B) belongs to the weak trace class S1,∞, i.e., s_j(f(A) − f(B)) ≤ const (1 + j)^{−1}. We deduce from this result that if A − B belongs to the trace class S1 and f is Lipschitz, then f(A) − f(B) ∈ SΩ, i.e., ∑_{j=0}^{n} s_j(f(A) − f(B)) ≤ const log(2 + n). We also obtain more general results about ...


Perturbed Iterate Analysis for Asynchronous Stochastic Optimization

We introduce and analyze stochastic optimization methods where the input to each gradient update is perturbed by bounded noise. We show that this framework forms the basis of a unified approach to analyze asynchronous implementations of stochastic optimization algorithms. In this framework, asynchronous stochastic optimization algorithms can be thought of as serial methods operatin...


Convergence Analysis of Sampling Methods for Perturbed Lipschitz Functions

In this short note we observe that results of Dennis and Audet extend naturally to a wide variety of deterministic sampling methods. For bound-constrained problems, we show that any method based on coordinate search which includes a sufficiently rich set of directions, for example random directions at each stage of the sampling, will, when applied to Lipschitz continuous problems, have cluster ...


Perturbed Smooth Lipschitz Extensions of Uniformly Continuous Functions on Banach Spaces

We show that if Y is a separable subspace of a Banach space X such that both X and the quotient X/Y have Cp-smooth Lipschitz bump functions, and U is a bounded open subset of X, then, for every uniformly continuous function f : Y ∩ U → R and every ε > 0, there exists a Cp-smooth Lipschitz function F : X → R such that |F(y) − f(y)| ≤ ε for every y ∈ Y ∩ U. If we are given a separable subspace Y o...


Takashi Noiri and Valeriu Popa ON ITERATE MINIMAL STRUCTURES AND M - ITERATE CONTINUOUS FUNCTIONS

We introduce the notion of mIT-structures determined by operators mInt and mCl on an m-space (X, mX). By using mIT-structures, we introduce and investigate a function f : (X, mIT) → (Y, mY) called MIT-continuous. As special cases of MIT-continuity, we obtain M-semicontinuity [21] and M-precontinuity [23].



Journal

Journal title: Journal of Optimization Theory and Applications

Year: 2022

ISSN: 0022-3239, 1573-2878

DOI: https://doi.org/10.1007/s10957-022-02093-0